Top 10 Reasons To Love Docker

by appfleet, January 23rd, 2020

When you first look into Docker, what it does, and how it works, it appears to be a neat tool to help with application packaging and deployment. It's not until you start using it, however, that some of the other benefits that developers love so much show themselves. So, to discover why this tool has become so popular, here are Ten Reasons Why Developers Love Docker.

1. Docker makes packaging applications easier

The traditional deployment methodology involves pushing an application artifact onto a server, then running it. This also includes making sure to set up whatever libraries and file structure the application requires.

Docker flips this on its head. A Docker image should include everything required to run an application, including the application server, dependencies, and file structure, as well as the application itself. This empowers developers to be responsible for more than just development, which makes sense, as it's their code after all.
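
As a rough sketch of what that looks like, here's a minimal Dockerfile for a small Node.js service. The base image, file names, and entry point are illustrative assumptions, not taken from the article:

# Start from a small Node.js base image
FROM node:12-alpine
WORKDIR /app
# Install dependencies first so they can be cached between builds
COPY package*.json ./
RUN npm install
# Copy the application code itself
COPY . .
# Run the app (assumes server.js is the entry point)
CMD ["node", "server.js"]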

2. Docker images are built once and run anywhere

Docker images are built once and run anywhere. Or more specifically, anywhere that you can run Docker.

This makes deployment more predictable, as the same image is getting deployed to your production and test environments. This takes any surprises out of the equation, as we can be sure that the application and its dependencies will be deployed in the same way.
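
In practice that boils down to something like the following, where the image name and tag (myapp:1.0.0) are just placeholders for this example:

# Build and tag the image once from the Dockerfile in the current directory
docker build -t myapp:1.0.0 .

# Run the exact same image on a laptop, a test server, or production
docker run --rm myapp:1.0.0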

3. Docker images are extensible

When someone publishes a Docker image, not only is it available for anyone with access to run it, but it can also be extended to add more functionality. Developers love this because it promotes reuse and less duplication, one of the key principles they apply when writing code.

As an example, let's say we wanted to use the popular lightweight Linux distribution Alpine Linux to run the curl command to fetch a specific URL. By default, Alpine Linux doesn't include curl, but we can easily extend the image to add it in, like so:

FROM alpine
RUN apk add curl

If we built and ran this image, we'd get all the functionality of Alpine Linux with the addition of curl.
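
For instance, saving that Dockerfile and building it lets us use curl straight away (the image name alpine-curl is just a tag chosen for this example):

# Build the extended image
docker build -t alpine-curl .

# Use the curl we just added to fetch a URL
docker run --rm alpine-curl curl -s https://example.com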

4. Most popular applications are already available in Docker

If you think of an open-source application that you want to run in Docker, the likelihood is that a Docker image for it is already available to download. This would normally be from Docker Hub, one of the most popular central registries of Docker images.

This makes developers' lives easier because they can quickly get everything they need to run an application started in their local development environment.

Take the popular database technology Postgres, for example. All you have to do is run "docker run postgres" and you have an instance of Postgres available, ready for your application to connect to.
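
In practice the official postgres image also expects a superuser password, and you'll usually want to publish the port so your application can reach it from the host. A slightly fuller command (the container name and password are placeholders) looks like this:

# POSTGRES_PASSWORD is required by the official image; -p publishes the default port
docker run --name dev-postgres -e POSTGRES_PASSWORD=devpassword -p 5432:5432 -d postgres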

5. Docker containers are...containers

It might sound obvious, but one of the huge advantages of using Docker is the containerized nature of the containers themselves. Broadly, this means that:

  • Processes running in containers cannot see other processes on the same host
  • Processes running in containers are constrained in how much memory and CPU they can use
  • Containers can only be reached on specified ports
  • A container's file system is separate from that of the host, and gets cleaned up when the container is removed

All of these points add up to the feeling that when you're using Docker, the containers are very much self-contained and throwaway. If you want to have a brand-new development environment, just stop the containers and start again.
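
Docker's run flags map directly onto those points. As a small sketch (the nginx image and the specific limits are arbitrary choices for illustration):

# Limit memory and CPU, and only publish port 8080 on the host
CID=$(docker run --rm -d --memory=256m --cpus=0.5 -p 8080:80 nginx)

# Stopping it throws the environment away: --rm removes the container
# and its writable filesystem layer along with it
docker stop "$CID"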

6. Sharing Docker images is easy with a Docker registry

A Docker Registry is a service that is used for distributing Docker images. It's kind of like what Maven Central is to Maven artifacts. One of the most well known Docker registries is Docker Hub, where you can find all sorts of publicly available images. Some alternatives include:

  • Google Container Registry
  • Amazon Elastic Container Registry
  • JFrog Artifactory

Of course, all of these options allow you to store private images that you only want authorized users to have access to. This means that developers can have their application Docker images built once, published to a registry, then deployed to whatever environments are required.
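
As a rough illustration, publishing an image is just a tag and a push (registry.example.com, myteam, and myapp are placeholders, and you'd normally authenticate first):

# Log in to the registry, tag the local image with its remote name, and push it
docker login registry.example.com
docker tag myapp:1.0.0 registry.example.com/myteam/myapp:1.0.0
docker push registry.example.com/myteam/myapp:1.0.0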

7. Docker containers make the development environment more like production

We've already touched on the fact that Docker makes packaging applications easier, but this has several other implications for developers.

One of the main ones is that if there's a problem with the production application, they can have the same application running in their development environment in seconds, by pulling the relevant versioned image from whatever Docker Registry it's been published to. Obviously the Docker container they're running also includes all the same dependencies and file structure as production, meaning less chance of those pesky production-only bugs.

On the flip side, this also means that when the application gets tested during its path to production, it's less likely that we'll see a bug in production that didn't manifest in the test environment.
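
Reproducing the exact production build locally can then be as simple as this (the image name and tag follow the placeholder registry from the previous section):

# Pull the exact version that is running in production...
docker pull registry.example.com/myteam/myapp:1.0.0

# ...and run it locally with the same dependencies and file structure
docker run --rm -p 8080:8080 registry.example.com/myteam/myapp:1.0.0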

8. Docker Compose makes launching whole environments a breeze for developers

Docker Compose is a companion tool to Docker (bundled with Docker Desktop, installed separately on Linux), which allows you to start whatever applications you need using a simple YAML specification.

You define the different applications that should be started up, along with other configuration such as networking. So, if an application you're developing depends on a database, or various other applications, it's a one-step process to get it all running.

Here's an example Docker Compose file that defines an instance of Prometheus and Grafana, the popular monitoring and graphing applications:

version: "3"
services:
  prometheus:
    image: prom/prometheus:latest
    ports:
      - 9090:9090
  grafana:
    image: grafana/grafana
    ports:
      - 3000:3000
    depends_on:
      - prometheus      

To start these applications you would just run "docker-compose up", then you'd have an instance of Prometheus running on http://localhost:9090 and Grafana on http://localhost:3000.

All of this can be committed into your version control system, making the developer's environment a lot more deterministic and predictable.

9. Harnessing the power of Docker speeds up the development workflow

One of developers' main pain points is the length of time it takes from writing code to having it delivered into production. This can sometimes be down to slow, cumbersome continuous integration (CI) processes that take ages to run tests.

Docker can help here, as a long-running test suite can be split up and run across several Docker containers. CI tools such as Jenkins make this easy, as you can configure them to spawn jobs on popular Docker orchestration frameworks such as Kubernetes and Amazon ECS.

As an example, if your testing process included the following stages, you could run them all in parallel in Docker, as sketched after this list:

  • unit tests
  • integration tests
  • performance tests
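
A very rough sketch of that idea with plain Docker commands, assuming the application image contains an npm-based test runner and these hypothetical script names, might be:

# Each stage runs in its own container, so all three can run at the same time
docker run --rm myapp:1.0.0 npm run test:unit &
docker run --rm myapp:1.0.0 npm run test:integration &
docker run --rm myapp:1.0.0 npm run test:performance &

# Wait for all three containers to finish
wait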

10. Docker orchestration services provide production-ready capabilities

Once we move to a production environment, high-availability and scalability become big concerns. It's all well and good running a single Docker container for our application in a test environment, but your customers probably won't appreciate it in production.

Fortunately, there are several Docker container orchestration frameworks available that take care of all the heavy lifting when it comes to scaling, deploying, and managing your application. Here are a few:

  • Docker Swarm is included with Docker and allows you to run Docker containers across multiple swarm nodes
  • Kubernetes is probably the most popular framework, providing a large amount of customisation capabilities
  • AWS ECS is Amazon's Elastic Container Service. While not as customisable as Kubernetes, it is easy to get started with and integrates well with other AWS services.

Features like auto-scaling, rolling deployments, and rollbacks mean that developers can sleep a little better at night, knowing that if problems do happen, the framework can, to some extent, take care of things.
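
As a small taste of what that looks like with Docker Swarm, which ships with Docker itself (the image names below reuse the placeholder registry from earlier):

# Turn this host into a swarm manager
docker swarm init

# Run three replicas of the application behind port 80
docker service create --name myapp --replicas 3 -p 80:8080 registry.example.com/myteam/myapp:1.0.0

# Roll out a new version gradually, and roll back if it misbehaves
docker service update --image registry.example.com/myteam/myapp:1.1.0 myapp
docker service rollback myapp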

Conclusion

Hopefully you can now understand a developer's point of view a little better when they tell you how much they love Docker.

It's no surprise, then, that in a recent survey of its customers by Datadog, approximately 25% were already using Docker. With popular cloud providers such as Amazon and Google offering ever more seamless integration with Docker, we can expect uptake to rise higher still.